Variational Reference Priors
Authors
Abstract
Posterior distributions are useful for a broad range of tasks in machine learning, ranging from model selection to reinforcement learning. Given that modern machine learning models can have millions of parameters, selecting an informative prior is typically infeasible, resulting in widespread use of priors that avoid strong assumptions. For example, recent work on deep generative models (Kingma & Welling, 2014; Rezende et al., 2014) commonly uses the standard Normal distribution for the prior on the latent space. However, just because a prior is relatively flat does not mean it is uninformative. The Jeffreys prior for the Bernoulli model serves as a well-known counterexample: Jeffreys (1946) showed that the arcsine distribution, despite its peaks near 0 and 1, is the truly objective prior (with respect to Fisher information) and not the uniform distribution. This suggests that objective priors such as the Jeffreys or the related Reference prior (Bernardo, 2005) are worthy of investigation for high-dimensional, web-scale probabilistic models. However, the challenge is that these priors are difficult to derive for all but the simplest models.
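The Bernoulli example from the abstract is easy to verify directly: the Fisher information of a Bernoulli(θ) observation is I(θ) = 1/(θ(1−θ)), so the Jeffreys prior ∝ √I(θ) is, after normalization, the arcsine distribution Beta(1/2, 1/2). A minimal sketch (the function names here are illustrative, not from the paper):

```python
import math

def jeffreys_bernoulli(theta):
    # Fisher information for one Bernoulli(theta) observation:
    # I(theta) = E[(d/dtheta log p(x|theta))^2] = 1 / (theta * (1 - theta))
    fisher = 1.0 / (theta * (1.0 - theta))
    # Jeffreys prior is proportional to sqrt(I(theta)); the normalizing
    # constant over (0, 1) is 1/pi, giving a proper density.
    return math.sqrt(fisher) / math.pi

def arcsine_pdf(theta):
    # Beta(1/2, 1/2) (arcsine) density: 1 / (pi * sqrt(theta * (1 - theta)))
    return 1.0 / (math.pi * math.sqrt(theta * (1.0 - theta)))

# The two densities agree on (0, 1), and the mass concentrates near 0 and 1:
# the objective prior is far from flat, as the abstract notes.
for theta in (0.1, 0.5, 0.9):
    assert abs(jeffreys_bernoulli(theta) - arcsine_pdf(theta)) < 1e-12
```

Note how the density at θ = 0.1 exceeds the density at θ = 0.5, even though the prior is "objective": flatness and uninformativeness are not the same thing.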
Similar resources
Learning Approximately Objective Priors
Informative Bayesian priors are often difficult to elicit, and when this is the case, modelers usually turn to noninformative or objective priors. However, objective priors such as the Jeffreys and reference priors are not tractable to derive for many models of interest. We address this issue by proposing techniques for learning reference prior approximations: we select a parametric family and ...
Variational Segmentation with Shape Priors
We discuss the design of shape priors for variational regionbased segmentation. By means of two different approaches, we elucidate the critical design issues involved: representation of shape, use of perceptually plausible dissimilarity measures, Euclidean embedding of shapes, learning of shape appearance from examples, combining shape priors and variational approaches to segmentation. The over...
Weakly Convex Coupling Continuous Cuts and Shape Priors
We introduce a novel approach to variational image segmentation with shape priors. Key properties are convexity of the joint energy functional and weak coupling of convex models from different domains by mapping corresponding solutions to a common space. Specifically, we combine total variation based continuous cuts for image segmentation and convex relaxations of Markov Random Field based shap...
Gibbs Sampling for Logistic Normal Topic Models with Graph-Based Priors
Previous work on probabilistic topic models has either focused on models with relatively simple conjugate priors that support Gibbs sampling or models with non-conjugate priors that typically require variational inference. Gibbs sampling is more accurate than variational inference and better supports the construction of composite models. We present a method for Gibbs sampling in non-conjugate l...
Variational Bayesian Multinomial Probit Regression with Gaussian Process Priors
It is well known in the statistics literature that augmenting binary and polychotomous response models with gaussian latent variables enables exact Bayesian analysis viaGibbs sampling from the parameter posterior. By adopting such a data augmentation strategy, dispensing with priors over regression coefficients in favor of gaussian process (GP) priors over functions, and employing variational a...
Publication date: 2017